
    Hiding in the Particles: When Return-Oriented Programming Meets Program Obfuscation

    Largely known for attack scenarios, code reuse techniques on closer inspection reveal properties that are also appealing for program obfuscation. We explore the popular return-oriented programming (ROP) paradigm in this light, transforming program functions into ROP chains that coexist seamlessly with the surrounding software stack. We show how to build chains that can withstand popular static and dynamic deobfuscation approaches, evaluating the robustness and overheads of the design on common programs. The results suggest that a significant amount of computational resources would be required to carry out a deobfuscation attack for secret-finding and code-coverage goals. Comment: Published in the proceedings of DSN'21 (51st IEEE/IFIP Int. Conf. on Dependable Systems and Networks). Code and BibTeX entry available at https://github.com/pietroborrello/raindro
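
    To make the core idea concrete, the toy Python sketch below expresses a tiny computation as a flat chain of "gadget" references and inline operands consumed by a dispatch loop, loosely mirroring how a ROP chain drives execution by returning into successive gadget addresses on the stack. The gadget names and the mini-interpreter are invented for illustration only and are not the paper's tooling.

        def gadget_load_const(state, chain):
            state["acc"] = chain.pop(0)              # pop an immediate operand from the chain

        def gadget_add_mem(state, chain):
            state["acc"] += state["mem"][chain.pop(0)]

        def gadget_store_mem(state, chain):
            state["mem"][chain.pop(0)] = state["acc"]

        def run_chain(chain, state):
            """Dispatch loop standing in for the ret-driven dispatch of a real ROP chain."""
            while chain:
                gadget = chain.pop(0)                # next "address" in the chain
                gadget(state, chain)                 # the gadget may consume further operands

        # Encode mem[1] = 5 + mem[0] as a flat sequence of gadgets and operands.
        state = {"acc": 0, "mem": {0: 7, 1: 0}}
        run_chain([gadget_load_const, 5, gadget_add_mem, 0, gadget_store_mem, 1], state)
        assert state["mem"][1] == 12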

    CustomProcessingUnit: Reverse Engineering and Customization of Intel Microcode

    Microcode provides an abstraction layer over the instruction set, decomposing complex instructions into simpler micro-operations that can be more easily implemented in hardware. It is essential for simplifying the design of x86 processors. However, introducing an additional layer of software beneath the instruction set poses security and reliability concerns. The microcode details are kept confidential by the manufacturers, preventing independent auditing or customization of the microcode. Moreover, microcode patches are signed and encrypted to prevent unauthorized patching and reverse engineering. Recent research, however, has recovered decrypted microcode and reverse-engineered read/write debug mechanisms on Intel Goldmont (Atom), making analysis and customization of microcode possible on a modern Intel microarchitecture. In this work, we present the first framework for static and dynamic analysis of Intel microcode. Building upon prior research, we reverse-engineer the Goldmont microcode semantics and reconstruct the patching primitives for microcode customization. For static analysis, we implement a Ghidra processor module for decompilation and analysis of decrypted microcode. For dynamic analysis, we create a UEFI application that can trace and patch microcode to provide complete microcode control on Goldmont systems. Leveraging our framework, we reverse-engineer the confidential Intel microcode update algorithm and perform the first security analysis of its design and implementation. In three further case studies, we illustrate the potential security and performance benefits of microcode customization: we provide the first x86 Pointer Authentication Code (PAC) microcode implementation and its security evaluation, design and implement fast software breakpoints that are more than 1000x faster than standard breakpoints, and present constant-time microcode division.
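
    As background for the constant-time division case study, the sketch below shows the general structure of a branchless restoring division with a fixed iteration count, written in Python purely for illustration; it is not the paper's microcode, and Python's big-integer arithmetic is of course not literally constant time, so only the data-independent control flow is the point here.

        def constant_time_udiv(dividend, divisor, bits=32):
            """Unsigned restoring division with a fixed iteration count and no data-dependent branches.
            Assumes 0 <= dividend < 2**bits and 0 < divisor < 2**bits."""
            quotient = remainder = 0
            for i in range(bits - 1, -1, -1):
                remainder = (remainder << 1) | ((dividend >> i) & 1)
                diff = remainder - divisor
                ge = 1 - ((diff >> bits) & 1)    # 1 if remainder >= divisor, else 0
                remainder -= divisor * ge        # conditional subtraction without a branch
                quotient |= ge << i
            return quotient, remainder

        assert constant_time_udiv(100, 7) == (14, 2)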

    Practical Timing Side-Channel Attacks on Memory Compression

    Compression algorithms have side channels due to their data-dependent operations. So far, only the compression-ratio side channel, i.e., the compressed data size, has been exploited. In this paper, we present Decomp+Time, the first memory compression attack exploiting a timing side channel in compression algorithms. While Decomp+Time affects a much broader set of applications than prior work, a key challenge is precisely crafting attacker-controlled compression payloads that give the attack sufficient resolution. We develop an evolutionary fuzzer, Comprezzor, which optimizes Decomp+Time payloads for latency differences and finds payloads so effective that decompression timing can even be exploited in remote Decomp+Time attacks across the internet. Decomp+Time has a capacity of 9.73 kB/s locally, and 10.72 bit/min across the internet (14 hops, > 700 miles). Using Comprezzor, we develop attacks that leak data byte by byte in four different case studies: First, we leak 1.50 bit/min from Memcached on a remote server running a PHP application. Second, we leak database records at 2.69 bit/min from PostgreSQL, managed by a Python-Flask application, over the internet. Third, we leak secrets at 49.14 bit/min locally from ZRAM-compressed pages on Linux. Fourth, we leak internal heap pointers from the V8 engine within the Google Chrome browser on a system using ZRAM. This highlights the importance of re-evaluating the use of compression on sensitive data, even if the application is only reachable via a remote interface.
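
    To illustrate the underlying observation rather than the attack itself, the Python sketch below measures how decompression latency depends on the data being decompressed, using zlib as a stand-in for the memory-compression targets studied in the paper; the payload sizes and repetition count are arbitrary illustrative choices.

        import os
        import time
        import zlib

        def median_decompression_time(payload: bytes, repeat: int = 200) -> float:
            """Median wall-clock time to decompress a zlib-compressed copy of payload."""
            blob = zlib.compress(payload)
            samples = []
            for _ in range(repeat):
                start = time.perf_counter()
                zlib.decompress(blob)
                samples.append(time.perf_counter() - start)
            return sorted(samples)[len(samples) // 2]

        # Highly redundant data and incompressible data take measurably different times
        # to decompress; such latency differences are what crafted payloads amplify.
        redundant = b"A" * 4096
        incompressible = os.urandom(4096)
        print(median_decompression_time(redundant), median_decompression_time(incompressible))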

    Juxtaposing BTE and ATE – on the role of the European insurance industry in funding civil litigation

    One of the ways in which legal services are financed, and indeed shaped, is through private insurance arrangements. Two contrasting types of legal expenses insurance (LEI) contracts seem to dominate in Europe: before-the-event (BTE) and after-the-event (ATE) legal expenses insurance. Notwithstanding institutional differences between legal systems, BTE and ATE insurance arrangements may be instrumental if government policy is geared towards strengthening a market-oriented system of financing access to justice for individuals and businesses. At the same time, emphasizing the role of a private industry as a keeper of the gates to justice raises issues of accountability and transparency that are not readily reconcilable with the demands of competition. Moreover, multiple actors (clients, lawyers, courts, insurers) are involved, causing behavioural dynamics that are not easily predicted or influenced. Against this background, this paper looks into BTE and ATE arrangements by analysing the particularities of the BTE and ATE products currently available in some European jurisdictions and by painting a picture of their respective markets and legal contexts. This allows for some reflection on the performance of BTE and ATE providers as both financiers and gatekeepers. Two issues emerge from the analysis that are worthy of further reflection: first, the problematic long-term sustainability of some ATE products; second, the challenges faced by policymakers who would like to nudge consumers into voluntarily taking out BTE LEI.

    Penilaian Kinerja Keuangan Koperasi di Kabupaten Pelalawan (Financial Performance Assessment of Cooperatives in Pelalawan Regency)

    This paper describes the development and financial performance of cooperatives in Pelalawan Regency during 2007-2008. The study covers primary and secondary cooperatives in 12 sub-districts. The method assesses performance in terms of the productivity, efficiency, growth, liquidity, and solvency of the cooperatives. Productivity of the cooperatives in Pelalawan was high but efficiency was still low. Profit and income were high, liquidity was very high, and solvency was good.

    Search for stop and higgsino production using diphoton Higgs boson decays

    Results are presented of a search for a "natural" supersymmetry scenario with gauge-mediated symmetry breaking. It is assumed that only the supersymmetric partners of the top quark (the stop) and of the Higgs boson (the higgsino) are accessible. Events are examined in which there are two photons forming a Higgs boson candidate and at least two b-quark jets. In 19.7 inverse femtobarns of proton-proton collision data at √s = 8 TeV, recorded by the CMS experiment, no evidence of a signal is found, and lower limits at the 95% confidence level are set, excluding stop masses below 360 to 410 GeV, depending on the higgsino mass.

    Severe early onset preeclampsia: short and long term clinical, psychosocial and biochemical aspects

    Preeclampsia is a pregnancy-specific disorder commonly defined as de novo hypertension and proteinuria after 20 weeks' gestational age. It occurs in approximately 3-5% of pregnancies and is still a major cause of both foetal and maternal morbidity and mortality worldwide [1]. As extensive research has not yet elucidated the aetiology of preeclampsia, there are no rational preventive or therapeutic interventions available. The only rational treatment is delivery, which benefits the mother but is not in the interest of the foetus if remote from term. Early onset preeclampsia (<32 weeks' gestational age) occurs in less than 1% of pregnancies. It is, however, often associated with maternal morbidity, as the risk of progression to severe maternal disease is inversely related to gestational age at onset [2]. Resulting prematurity is therefore the main cause of neonatal mortality and morbidity in patients with severe preeclampsia [3]. Although the discussion is ongoing, perinatal survival is suggested to be increased in patients with preterm preeclampsia by expectant, non-interventional management. This temporising treatment option to lengthen pregnancy includes the use of antihypertensive medication to control hypertension, magnesium sulphate to prevent eclampsia, and corticosteroids to enhance foetal lung maturity [4]. With optimal maternal haemodynamic status and a reassuring foetal condition, this results on average in an extension of 2 weeks. Prolongation of these pregnancies poses a great challenge for clinicians, who must balance potential maternal risks on the one hand against possible foetal benefits on the other. Clinical controversies regarding prolongation of preterm preeclamptic pregnancies still exist, particularly since preeclampsia is the leading cause of maternal mortality in the Netherlands [5], and the debate is even more pronounced in very preterm pregnancies with questionable foetal viability [6-9]. Do the maternal risks of prolonging these very early pregnancies outweigh the chances of neonatal survival? Counselling of women with very early onset preeclampsia requires not only knowledge of the outcome of those particular pregnancies, but also knowledge of the outcomes of these women's future pregnancies, which is of major clinical importance. This thesis opens with a review of the literature on identifiable risk factors for preeclampsia.

    Measurement of tt̄ normalised multi-differential cross sections in pp collisions at √s = 13 TeV, and simultaneous determination of the strong coupling strength, top quark pole mass, and parton distribution functions

    Peer reviewed

    An embedding technique to determine ττ backgrounds in proton-proton collision data

    An embedding technique is presented to estimate standard model ττ backgrounds from data with minimal simulation input. In the data, the muons are removed from reconstructed μμ events and replaced with simulated τ leptons with the same kinematic properties. In this way, a set of hybrid events is obtained that does not rely on simulation except for the decay of the τ leptons. The challenges in describing the underlying event or the production of associated jets in the simulation are avoided. The technique described in this paper was developed for CMS. Its validation and the inherent uncertainties are also discussed. The demonstration of the performance of the technique is based on a sample of proton-proton collisions collected by CMS in 2017 at √s = 13 TeV, corresponding to an integrated luminosity of 41.5 fb⁻¹.
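
    As a rough conceptual sketch of the kinematic replacement step only (not CMS software, and ignoring detector effects and the τ decay itself), the Python fragment below swaps each muon's kinematics for a τ four-momentum with identical pT, η, and φ, as might be done before handing the τ leptons to a decay simulation; the event representation here is invented for illustration.

        import math

        TAU_MASS = 1.77686  # GeV, PDG value

        def four_vector(pt, eta, phi, mass):
            """Build (E, px, py, pz) in GeV from pt, eta, phi and an assigned mass."""
            px, py = pt * math.cos(phi), pt * math.sin(phi)
            pz = pt * math.sinh(eta)
            energy = math.sqrt(px * px + py * py + pz * pz + mass * mass)
            return energy, px, py, pz

        def embed_taus(muon_kinematics):
            """Replace each selected muon with a tau of identical pt, eta, phi."""
            return [four_vector(pt, eta, phi, TAU_MASS) for pt, eta, phi in muon_kinematics]

        # Two muons from a selected mu-mu event (values are made up); the rest of the
        # event record (jets, underlying event) would be kept from the data.
        hybrid_taus = embed_taus([(45.0, 0.3, 1.2), (38.0, -1.1, -2.0)])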

    Measurement of the splitting function in pp and Pb-Pb collisions at √s_NN = 5.02 TeV

    Data from heavy ion collisions suggest that the evolution of a parton shower is modified by interactions with the color charges in the dense partonic medium created in these collisions, but it is not known where in the shower evolution the modifications occur. The momentum ratio of the two leading partons, resolved as subjets, provides information about the parton shower evolution. This substructure observable, known as the splitting function, reflects the process of a parton splitting into two other partons and has been measured for jets with transverse momentum between 140 and 500 GeV, in pp and PbPb collisions at a center-of-mass energy of 5.02 TeV per nucleon pair. In central PbPb collisions, the splitting function indicates a more unbalanced momentum ratio compared to peripheral PbPb and pp collisions. The measurements are compared to various predictions from event generators and analytical calculations.
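
    For reference, the momentum-sharing observable commonly measured in such splitting-function analyses (often denoted z_g in the soft-drop grooming literature) is the fraction of the subjet-pair transverse momentum carried by the softer subjet; the definition below is the standard convention and is stated here as an assumption about the exact form used in the paper.

        z_g = \frac{\min(p_{T,1},\, p_{T,2})}{p_{T,1} + p_{T,2}}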